Information Theory and Statistical Physics
Abstract
Relationships between information theory and statistical physics have been recognized over the last few decades. One aspect is identifying the structure of optimization problems arising in certain information-theoretic settings, drawing an analogy to parallel structures in statistical physics, and then borrowing statistical-mechanical insights, as well as powerful analysis techniques (like the replica method), from statistical physics to the dual information-theoretic setting of interest. Another aspect is the application of the maximum entropy principle, which emerged in statistical mechanics, treating it as a general guiding principle for problems in information theory, e.g., signal processing, speech coding, and spectrum estimation. In the reverse direction, we can consider statistical mechanics as a form of statistical inference. Information theory gives us a constructive criterion for setting up probability distributions on the basis of partial knowledge, leading to the maximum-entropy estimate. The usual rules of statistical physics are thus an immediate consequence of the maximum-entropy principle. The facts about maximization of entropy were stated by Gibbs much earlier, but this property was treated as a side remark, not providing any justification for the methods of statistical mechanics [2]. This missing "feature" has been supplied by information theory. Entropy can now be taken as the conceptual starting point, and the fact that a probability distribution maximizes the entropy subject to certain constraints becomes an essential fact which justifies the use of that distribution for inference. Landauer's erasure principle [6] provides a powerful link between information theory and physics.
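As a minimal numerical sketch of the maximum-entropy point, the following Python snippet uses a toy three-level system with hypothetical energy values; it checks that the Boltzmann distribution has higher entropy than nearby distributions satisfying the same normalization and mean-energy constraints:

```python
import numpy as np

# Energy levels of a toy 3-state system (hypothetical values for illustration)
E = np.array([0.0, 1.0, 2.0])
beta = 1.0  # inverse temperature

# Boltzmann distribution: the maximum-entropy distribution
# subject to normalization and a fixed mean energy
p_star = np.exp(-beta * E)
p_star /= p_star.sum()
mean_energy = p_star @ E

def entropy(p):
    return -np.sum(p * np.log(p))

# A direction v that preserves both constraints:
# sum(v) = 0 and sum(v * E) = 0
v = np.cross(np.ones(3), E)
v /= np.linalg.norm(v)

# Along the constraint manifold, entropy peaks at the Boltzmann point
for t in [-0.05, -0.01, 0.0, 0.01, 0.05]:
    p = p_star + t * v
    assert abs(p.sum() - 1.0) < 1e-12
    assert abs(p @ E - mean_energy) < 1e-12
    print(f"t={t:+.2f}  H={entropy(p):.6f}")
```

Because entropy is strictly concave, the Boltzmann point (t = 0) is the unique maximizer on this constraint set, which the printed values reflect.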
Information processing, or even the mere storage of information, generates entropy. According to the principle, the erasure of every bit of information increases the thermodynamic entropy of the world by k ln 2, where k is Boltzmann's constant (1.38 × 10⁻²³ J/K); this suggests a very strong link between the two areas. This report looks at some of the parallels between the two fields. First it discusses the results of [4], which draw an analogy between the information inequality and the Data Processing Theorem (DPT) on the one hand, and the second law of thermodynamics on the other. The DPT is used in many proofs of converse theorems in information theory, so the roots of the fundamental limits of information theory can be traced to the laws of physics. Then it discusses a result [3] about Chernoff bounds and ...
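The entropy increase of k ln 2 per erased bit translates directly into a minimum heat dissipation of kT ln 2 joules per bit at temperature T. A small sketch of this arithmetic (function name and example figures chosen here for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact value in the 2019 SI)

def landauer_limit(n_bits, temperature_kelvin):
    """Minimum heat dissipated by erasing n_bits at the given temperature,
    per Landauer's principle: k_B * T * ln(2) joules per bit."""
    return n_bits * k_B * temperature_kelvin * math.log(2)

# Erasing one gigabyte (8e9 bits) at room temperature (300 K):
# on the order of 1e-11 J, far below the dissipation of real hardware
print(landauer_limit(8e9, 300.0))
```

The tiny number illustrates why the Landauer bound is a fundamental limit rather than a practical engineering constraint today.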
Similar resources
Analysis of Resting-State fMRI Topological Graph Theory Properties in Methamphetamine Drug Users Applying Box-Counting Fractal Dimension
Introduction: Graph theoretical analysis of functional Magnetic Resonance Imaging (fMRI) data has provided new measures for mapping the human brain in vivo. Of all methods to measure the functional connectivity between regions, Linear Correlation (LC) calculation of the activity time series of brain regions, as a linear measure, is considered the most ubiquitous one. The strength of the dependence obl...
Information Theory and Statistical Physics - Lecture Notes
This document consists of lecture notes for a graduate course, which focuses on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, as well as at graduate students in Physics who have a basic background in Information Theory. Strong emphasis is given to the analogy and parallelism betwe...
Statistical Physics and Information Theory
This is a set of lecture notes for a graduate course, which focuses on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, as well as at graduate students in Physics who have a basic background in Information Theory. Strong emphasis is given to the analogy and parallelism between Inform...
Lecture Notes on Information Theory and Statistical Physics
As the title suggests, this paper is based on lecture notes of a graduate course, which focuses on the relations between information theory and statistical physics. The course was delivered at the Technion during the Spring of 2010 for the first time, and its target audience consists of EE graduate students in the area of communications and information theory, as well as graduate students in Physi...
Fisher Information and Statistical Mechanics
Fisher information is an important concept in statistical estimation theory and information theory, but it has received relatively little consideration in statistical physics. In order to rectify this oversight, in this brief note I will review the correspondence between Fisher information and fluctuations at thermodynamic equilibrium, and discuss various applications of Fisher information to equ...
Publication date: 2009